A recent editorial in the Journal of the American Medical Association (JAMA) addresses the growing problem of medical misinformation, such as anti-vaccine views, fake treatments, unproven alternative products and services, and more. The authors, Armstrong and Naylor, make some good recommendations, but unfortunately they are about 20 years behind the times when it comes to confronting scientific misinformation.

They identify the source of the problem as being primarily the extensive availability of medical information online that is not vetted or reviewed, combined with a rising tide of anti-science sentiments in the culture. They write:

This new online world facilitates direct-to-consumer marketing by phony experts, celebrities with armies of Twitter followers, and legions of independent digital scammers, including some physicians. The result has been torrents of misinformation on topics as varied as the safety and effectiveness of vaccinations, the Zika virus outbreak, water fluoridation, genetically modified foods, and treatments for common diseases.

This is all correct, but leaves out several important factors. We could also focus on the tolerance of complementary and alternative medicine (CAM) in academia, in education, in the medical profession more broadly, and increasingly in regulation. CAM is a huge problem because at its core it is based on breaking the rules of science, promoting conspiratorial and anti-science thinking, and loosening regulations.

Another factor to consider is government regulation, which should optimally be science-based but increasingly is not. The culture of academia also needs to change, to value and promote the public understanding of medical science. The authors do touch on this, but do not address the underlying problem with academic culture.

They talk a lot about scientific literacy, which means they are stuck in the decades-old “knowledge deficit” model of pseudoscience. Lack of scientific knowledge is an important factor, but it is probably less important than a lack of critical thinking skills. Facts don’t matter if you are stuck in a conspiracy mindset, do not trust science, or are overwhelmed with motivated reasoning and confirmation bias.

In fact, embracing medical pseudoscience increases as general education level increases, probably due to increased access to disposable income. This shows that education itself is not the inoculation against pseudoscience that the authors seem to believe it is.

Because their list of contributing factors is too limited, their solutions are also limited – they are good, just not nearly enough. The primary focus of their paper is what medical journals can do to address the problem. They offer four major solutions: containment of dissemination, “general immunity through scientific literacy”, health-specific education, and debunking myths.

These are all good things, and journals should absolutely be doing them, but again – they will not be nearly enough.

I think that containment of dissemination is probably a lost cause. That genie is out of the bottle. Perhaps big social media outlets like Facebook can change their algorithms so that sensational nonsense is not promoted, but short of a huge change to social media and the internet allowing for China-style top-down control, I don’t think we are going to stop misinformation. We simply have to deal with it.

And again, they make some good recommendations, but mostly limited to the knowledge-deficit model. Journals should provide executive summaries of technical articles written for a broad lay audience. They could also provide forums in which experts can interface with the public regarding specific articles or topics. They recommend topic-specific journals and issues tackling subjects of public interest or that are common targets of misinformation. Journals and experts can also directly debunk myths and misinformation.

This is all good, but here are some further recommendations (some of which go beyond the scope of medical journals). I will start, however, with what journals can do.

The culture of academia and science needs to shift away from publishing lots of preliminary studies. The threshold for being published needs to be higher, with better statistics, better editorial review, internal replications, checks against p-hacking, and more exact replications.

Journals that are primarily dedicated to pseudoscience should not be listed as formally peer-reviewed. Basically, no specialty journal dedicated to CAM should be listed as legitimate unless it has editors who are dedicated to science itself (not CAM) and maintain extremely high standards. We cannot just let CAM proponents be their own experts and decide what is valid. That would just open the floodgates to pseudoscience.

The entire idea of open access also needs to be rethought. Perhaps, in order to be listed as a legitimate peer-reviewed journal, certain standards need to be put in place that address the problems of both open-access and traditional journals. The open-access model encourages publishing lots of poor-quality articles because journals charge per publication. Traditional journals, on the other hand, chase impact factor, which favors dramatic studies that are the most likely to be wrong and later retracted.

We need to find a model that allows for public access of important articles, or at least the lay summaries, while rewarding editorial quality (not the sensationalism of the results).

Academia and medical journals also need to get more involved in promoting rational government regulations of all aspects of health care.

But there are also factors outside the control of journal editors that need to be addressed. The entire science education infrastructure needs an overhaul. Critical thinking needs to be incorporated from day one. Students need to develop the skills necessary to tell science from pseudoscience, to evaluate claims and sources, to be skeptical of unusual claims, and to recognize the pitfalls of conspiracy thinking and other cognitive biases.

Confronting fraud, pseudoscience, science denial, and misinformation needs to be a top priority for the academic, scientific, and medical communities. This also needs to go beyond just confronting what appears to be a short-term crisis. This is forever. We need a culture change in which these priorities are recognized as intrinsic to the scientific enterprise.

Posted by Steven Novella

Founder and currently Executive Editor of Science-Based Medicine, Steven Novella, MD is an academic clinical neurologist at the Yale University School of Medicine. He is also the host and producer of the popular weekly science podcast, The Skeptics’ Guide to the Universe, and the author of the NeuroLogicaBlog, a daily blog that covers news and issues in neuroscience, but also general science, scientific skepticism, philosophy of science, critical thinking, and the intersection of science with the media and society. Dr. Novella has also produced two courses with The Great Courses, and published a book on critical thinking, also called The Skeptics’ Guide to the Universe.